DistilBERT Base German Cased
Apache-2.0
This is a lightweight German-language BERT model: through knowledge distillation it retains most of the performance of the original German BERT while significantly reducing model size and computational requirements.
Large Language Model
Transformers German
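A minimal usage sketch with the Hugging Face Transformers library, assuming the checkpoint is published under the id `distilbert-base-german-cased` and exposes the masked-language-modeling head it was pretrained with (adjust the id if the model is hosted under a different name):

```python
# Hypothetical usage sketch for distilbert-base-german-cased (assumed checkpoint id).
from transformers import AutoTokenizer, AutoModelForMaskedLM
import torch

model_id = "distilbert-base-german-cased"  # assumption: hub id of this model
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)
model.eval()

# Fill-in-the-blank query using the tokenizer's mask token.
text = f"Die Hauptstadt von Deutschland ist {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the masked position and take the highest-scoring vocabulary entry.
mask_index = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
predicted_id = logits[0, mask_index].argmax(dim=-1)
prediction = tokenizer.decode(predicted_id).strip()
print(prediction)
```

Because the distilled model keeps the same tokenizer and task heads as its BERT teacher, it can be dropped into existing Transformers pipelines (e.g. `pipeline("fill-mask", model=model_id)`) with no further changes.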